
    An Evaluation of Three Distance Measurement Technologies for Flying Light Specks

    This study evaluates the accuracy of three different types of time-of-flight sensors for measuring distance. We envision the possible use of these sensors to localize swarms of flying light specks (FLSs) to illuminate objects and avatars of a metaverse. An FLS is a miniature-sized drone configured with RGB light sources. It is unable to illuminate a point cloud by itself. However, the inter-FLS relationship effect of an organizational framework will compensate for the simplicity of each individual FLS, enabling a swarm of cooperating FLSs to illuminate complex shapes and render haptic interactions. Distance between FLSs is an important criterion of the inter-FLS relationship. We consider sensors that use radio frequency (UWB), infrared light (IR), and sound (ultrasonic) to quantify this metric. Obtained results show that only one sensor is able to measure distances as small as 1 cm with high accuracy. A sensor may require a calibration process that impacts its accuracy in measuring distance. Comment: In International Conference on Intelligent Metaverse Technologies and Applications (iMETA2023), Tartu, Estonia, September 18-20, 202
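    All three sensing modalities (UWB, IR, ultrasonic) share the same underlying computation: distance is the signal's propagation speed multiplied by half the measured round-trip time. A minimal sketch of this relationship (the function name and numeric values are illustrative, not taken from the paper):

    ```python
    def tof_distance(round_trip_time_s: float, wave_speed_m_s: float) -> float:
        """Convert a time-of-flight round-trip measurement to distance.

        The signal travels to the target and back, so the one-way
        distance is speed * time / 2.
        """
        return wave_speed_m_s * round_trip_time_s / 2.0

    # Ultrasonic: sound travels ~343 m/s in air at 20 C, so a 1 cm
    # distance corresponds to a ~58 microsecond round trip.
    d_ultrasonic = tof_distance(58.3e-6, 343.0)
    # UWB / IR: electromagnetic waves travel at ~3.0e8 m/s, so the same
    # 1 cm corresponds to only ~67 picoseconds of round-trip time.
    d_uwb = tof_distance(66.7e-12, 3.0e8)
    ```

    The contrast between the two timescales suggests why sub-centimeter accuracy is far more demanding for RF and optical sensors than for ultrasonic ones: the required timing resolution shrinks by roughly six orders of magnitude.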

    Development and Evaluation of a Learning-based Model for Real-time Haptic Texture Rendering

    Current Virtual Reality (VR) environments lack the rich haptic signals that humans experience during real-life interactions, such as the sensation of texture during lateral movement on a surface. Adding realistic haptic textures to VR environments requires a model that generalizes to variations of a user's interaction and to the wide variety of existing textures in the world. Current methodologies for haptic texture rendering exist, but they usually develop one model per texture, resulting in low scalability. We present a deep learning-based, action-conditional model for haptic texture rendering and evaluate its perceptual performance in rendering realistic texture vibrations through a multi-part human user study. This model is unified over all materials and uses data from a vision-based tactile sensor (GelSight) to render the appropriate surface conditioned on the user's action in real time. For rendering texture, we use a high-bandwidth vibrotactile transducer attached to a 3D Systems Touch device. The results of our user study show that our learning-based method creates high-frequency texture renderings with comparable or better quality than state-of-the-art methods, without the need to learn a separate model per texture. Furthermore, we show that the method is capable of rendering previously unseen textures using a single GelSight image of their surface. Comment: 10 pages, 8 figures

    The Penn Haptic Texture Toolkit for Modeling, Rendering, and Evaluating Haptic Virtual Textures

    The Penn Haptic Texture Toolkit (HaTT) is a collection of 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render these textures using an impedance-type haptic device such as a SensAble Phantom Omni. This toolkit was developed to provide haptics researchers with a method by which to compare and validate their texture modeling and rendering methods. The included rendering code has the additional benefit of allowing others, both researchers and designers, to incorporate our textures into their virtual environments, which will lead to a richer experience for the user.

    Investigating Social Haptic Illusions for Tactile Stroking (SHIFTS)

    A common and effective form of social touch is stroking on the forearm. We seek to replicate this stroking sensation using haptic illusions. This work compares two methods that provide sequential discrete stimulation: sequential normal indentation and sequential lateral skin-slip using discrete actuators. Our goals are to understand which form of stimulation more effectively creates a continuous stroking sensation, and how many discrete contact points are needed. We performed a study with 20 participants in which they rated sensations from the haptic devices on continuity and pleasantness. We found that lateral skin-slip created a more continuous sensation, and that decreasing the number of contact points decreased the continuity. These results inform the design of future wearable haptic devices and the creation of haptic signals for effective social communication. Comment: To be published in IEEE Haptics Symposium 202

    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data

    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
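    The core of this style of data-driven synthesis can be illustrated with an all-pole (autoregressive) model driven by white noise. The paper itself uses full ARMA models, stable resampling, and line-spectral-frequency interpolation; the sketch below is a simplification with made-up coefficients, intended only to show the "filtered noise" principle:

    ```python
    import numpy as np

    def synthesize_ar_vibration(ar_coeffs, noise_std, n_samples, seed=0):
        """Generate a texture-like vibration by filtering white noise
        through an autoregressive model:
            y[n] = sum_k a[k] * y[n-1-k] + e[n]
        where e[n] is zero-mean Gaussian noise.
        """
        rng = np.random.default_rng(seed)
        a = np.asarray(ar_coeffs, dtype=float)
        p = len(a)
        e = rng.normal(0.0, noise_std, n_samples)
        y = np.zeros(n_samples)
        for n in range(n_samples):
            for k in range(p):
                if n - 1 - k >= 0:
                    y[n] += a[k] * y[n - 1 - k]
            y[n] += e[n]
        return y

    # Illustrative stable AR(2) model (poles at magnitude ~0.89,
    # inside the unit circle, so the output stays bounded).
    vib = synthesize_ar_vibration([1.2, -0.8], noise_std=0.05, n_samples=2000)
    ```

    The model order and coefficients would, in the paper's pipeline, be fit to recorded contact accelerations for each force-speed condition rather than chosen by hand.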

    Dronevision: An Experimental 3D Testbed for Flying Light Specks

    Today's robotic laboratories for drones are housed in a large room. At times, they are the size of a warehouse. These spaces are typically equipped with permanent devices to localize the drones, e.g., Vicon infrared cameras. Significant time is invested to fine-tune the localization apparatus to compute and control the position of the drones. One may use these laboratories to develop a 3D multimedia system with miniature-sized drones configured with light sources. As an alternative, this brave new idea paper envisions shrinking these room-sized laboratories to the size of a cube or cuboid that sits on a desk and costs less than 10K dollars. The resulting Dronevision (DV) will be the size of a 1990s television. In addition to light sources, its Flying Light Specks (FLSs) will be network-enabled drones with storage and processing capability to implement decentralized algorithms. The DV will include a localization technique to expedite development of 3D displays. It will act as a haptic interface for a user to interact with and manipulate the 3D virtual illuminations. It will empower an experimenter to design, implement, test, debug, and maintain software and hardware that realize novel algorithms in the comfort of their office, without having to reserve a laboratory. In addition to enhancing productivity, it will improve the safety of the experimenter by minimizing the likelihood of accidents. This paper introduces the concept of a DV, the research agenda one may pursue using this device, and our plans to realize one.

    A report of metastatic squamous cell carcinoma in a Matamata turtle (Chelus fimbriatus)

    This case report describes a 38-year-old, wild-caught male Matamata turtle housed at the Rosamond-Gifford Zoo that presented with a mass located on the dorsal aspect of the right rear limb. The turtle had been hypophagic for the past 4 weeks. Upon physical exam, the turtle was weak but had not lost a significant amount of weight since the previous year (10% body weight). There was a 4 cm diameter mass on the dorsal aspect of the right thigh and stifle that appeared to have invaded the adjacent carapace. Differential diagnoses for turtle shell lesions include infectious diseases (bacterial, viral, and fungal), trauma with subsequent infection, and, rarely, neoplasia. Our diagnostic and treatment plan included blood collection, whole-body radiographs, and ultrasound of the mass. The mass was debrided, and samples of the abnormal tissues were collected and submitted for histopathologic examination and bacterial and fungal cultures. Until the results of the biopsy and cultures were obtained, the wound was managed with lavage and daily bandage changes. The turtle was treated with enrofloxacin, SQ fluids, and ketoprofen. The biopsy revealed squamous cell carcinoma. Due to the Matamata's poor prognosis, euthanasia was elected. The necropsy revealed metastatic squamous cell carcinoma and mild bacterial sepsis. This is only the second report of squamous cell carcinoma in turtles.

    Data-driven haptic modeling and rendering of realistic virtual textured surfaces

    The haptic sensations one feels when interacting with physical objects create a rich and varied impression of the objects, allowing one to gather information about texture, shape, compressibility, and other physical characteristics. The human sense of touch excels at sensing and interpreting these haptic cues, even when the object is felt through an intermediary tool instead of directly with a bare finger. Dragging, pressing, and tapping a tool on the object allow you to sense the object's roughness, slipperiness, and hardness as a combination of vibrations and forces. Unfortunately, the richness of these interaction cues is missing from many virtual environments, leading to a less satisfying and less immersive experience than one encounters in the physical world. However, we can create the perceptual illusion of touching a real object by displaying the appropriate haptic signals during virtual interactions. This thesis presents methods for creating haptic models of textured surfaces from acceleration, force, and speed data recorded during physical interactions. The models are then used to synthesize haptic signals that are displayed to the user during rendering through vibrotactile and/or kinesthetic feedback. The haptic signals, which are a function of the interaction conditions and motions used during rendering, must respond realistically to the user's motions in the virtual environment. We conducted human subject studies to test how well our virtual surfaces capture the psychophysical dimensions humans perceive when exploring textured surfaces with a tool. Three haptic rendering systems were created for displaying virtual surfaces using these surface models. An initial system displayed virtual versions of textured surfaces on a tablet computer using models of the texture vibrations induced when dragging a tool across the real surfaces. An evaluation of the system showed that displaying the texture vibrations accurately captured the surface's roughness, but additional modeling and rendering considerations were needed to capture the full feel of the surface. Using these results, a second system was created for rendering a more complete three-dimensional version of the haptic surfaces, including surface friction and event-based tapping transients in addition to the texture vibrations. An evaluation of this system showed that we have created the most realistic haptic surfaces to date. The force-feedback haptic device used in this system, however, was not without its limitations, including low surface stiffness and undesired inertia and friction. We developed an ungrounded haptic augmented reality system to overcome these limitations. This system allowed us to change the perceived texture and friction of a physical three-dimensional object using the previously-developed haptic surface models.
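    A recurring step in this line of work is selecting a texture model that matches the user's current force and speed, given models recorded at only a grid of conditions. Interpolating model parameters between the four nearest recorded conditions can be sketched with standard bilinear interpolation (the function name and the parameter grid below are illustrative, not the thesis's actual data):

    ```python
    import numpy as np

    def bilinear_model_params(force, speed, forces, speeds, params):
        """Interpolate per-condition model parameters at (force, speed).

        forces: sorted 1-D array of recorded force levels
        speeds: sorted 1-D array of recorded speed levels
        params: array of shape (len(forces), len(speeds), n_params),
                e.g. line spectral frequencies per recorded condition
        """
        # Locate the grid cell containing the query, clamping to the edges.
        i = int(np.clip(np.searchsorted(forces, force) - 1, 0, len(forces) - 2))
        j = int(np.clip(np.searchsorted(speeds, speed) - 1, 0, len(speeds) - 2))
        tf = np.clip((force - forces[i]) / (forces[i + 1] - forces[i]), 0.0, 1.0)
        ts = np.clip((speed - speeds[j]) / (speeds[j + 1] - speeds[j]), 0.0, 1.0)
        # Weighted blend of the four surrounding conditions.
        return ((1 - tf) * (1 - ts) * params[i, j]
                + tf * (1 - ts) * params[i + 1, j]
                + (1 - tf) * ts * params[i, j + 1]
                + tf * ts * params[i + 1, j + 1])

    # Toy 2x2 grid of one-parameter "models".
    forces = np.array([1.0, 2.0])
    speeds = np.array([10.0, 20.0])
    params = np.array([[[0.0], [10.0]], [[20.0], [30.0]]])
    mid = bilinear_model_params(1.5, 15.0, forces, speeds, params)
    ```

    Interpolating line spectral frequencies rather than raw filter coefficients, as the related TexturePad work describes, has the useful property that the interpolated filter remains stable.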